Statistical Inference for Rényi Entropy Functionals
Authors
Abstract
Numerous entropy-type characteristics (functionals) generalizing Rényi entropy are widely used in mathematical statistics, physics, information theory, and signal processing for characterizing uncertainty in probability distributions and in distribution identification problems. We consider estimators of some entropy (integral) functionals for discrete and continuous distributions, based on the number of epsilon-close vector records in the corresponding independent and identically distributed samples from two distributions. The estimators form a triangular scheme of generalized U-statistics. We establish asymptotic properties of these estimators, such as consistency and asymptotic normality. The results can be applied to various problems in computer science and mathematical statistics, e.g., approximate matching for random databases, record linkage, and image matching.
AMS 2000 subject classification: 94A15, 62G20
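To make the coincidence idea concrete, the following is a minimal one-sample sketch for the quadratic case (alpha = 2): the fraction of epsilon-close pairs, normalized by the volume of the epsilon-ball, estimates the integral of f^2, and minus its logarithm estimates the quadratic Rényi entropy. This is only an illustration under simplifying assumptions (a single sample, a fixed epsilon, Euclidean norm); the paper's estimators, their two-sample form, and the asymptotic regime for epsilon are as defined there, and the function names below are hypothetical.

import math
import numpy as np

def eps_ball_volume(d, eps):
    # Volume of a Euclidean ball of radius eps in R^d.
    return math.pi ** (d / 2) / math.gamma(d / 2 + 1) * eps ** d

def quadratic_renyi_entropy_estimate(x, eps):
    # Estimate h_2 = -log(integral of f^2) from an i.i.d. sample x of shape (n, d).
    # The fraction of epsilon-close pairs among all n*(n-1)/2 pairs, divided by the
    # epsilon-ball volume, estimates the integral of f^2 when eps is small.
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    close_pairs = np.count_nonzero(dists[np.triu_indices(n, k=1)] < eps)
    q_hat = close_pairs / (n * (n - 1) / 2 * eps_ball_volume(d, eps))
    return -np.log(q_hat)

rng = np.random.default_rng(0)
sample = rng.standard_normal((1000, 2))          # 2-d standard normal sample
print(quadratic_renyi_entropy_estimate(sample, eps=0.2))
# True value for the d-dimensional standard normal is (d/2) * log(4*pi) ~ 2.53 for d = 2.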
Similar resources
The Rate of Rényi Entropy for Irreducible Markov Chains
In this paper, we obtain the Rényi entropy rate for irreducible aperiodic Markov chains with countable state space, using the theory of countable nonnegative matrices. We also obtain a bound on the Rényi entropy rate of an irreducible Markov chain. Finally, we show that this bound is the Shannon entropy rate.
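As a concrete illustration of the quantity discussed above, the sketch below computes the Rényi entropy rate of a finite-state irreducible chain from the spectral radius of the matrix of alpha-th powers of the transition probabilities, H_alpha = log(rho([p_ij^alpha])) / (1 - alpha), together with the Shannon rate as the alpha -> 1 limit. The countable-state case treated in the paper requires the theory of countable nonnegative matrices cited there; the function names here are illustrative.

import numpy as np

def renyi_entropy_rate(P, alpha):
    # Rényi entropy rate (order alpha != 1) of a finite irreducible chain,
    # via the Perron root of the matrix with entries p_ij ** alpha.
    R = np.asarray(P, dtype=float) ** alpha
    perron_root = np.max(np.real(np.linalg.eigvals(R)))
    return np.log(perron_root) / (1.0 - alpha)

def shannon_entropy_rate(P):
    # Shannon entropy rate: -sum_i pi_i sum_j p_ij log p_ij, with pi stationary.
    P = np.asarray(P, dtype=float)
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()
    plogp = np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0)
    return -np.sum(pi * plogp.sum(axis=1))

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
for a in (0.5, 0.99, 1.01, 2.0):
    print(a, renyi_entropy_rate(P, a))
print("Shannon rate:", shannon_entropy_rate(P))   # close to the alpha -> 1 values above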
A Preferred Definition of Conditional Rényi Entropy
The Rényi entropy is a generalization of Shannon entropy to a one-parameter family of entropies. Tsallis entropy too is a generalization of Shannon entropy. The measure for Tsallis entropy is non-logarithmic. After the introduction of Shannon entropy, the conditional Shannon entropy was derived and its properties became known. Also, for Tsallis entropy, the conditional entropy was introduced a...
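For reference, the standard definitions behind this comparison (the cited paper's notation may differ):

H_\alpha(p) = \frac{1}{1-\alpha}\,\log \sum_k p_k^{\alpha}, \qquad \alpha > 0,\ \alpha \neq 1,
\lim_{\alpha \to 1} H_\alpha(p) = -\sum_k p_k \log p_k \quad \text{(Shannon entropy)},
S_q(p) = \frac{1}{q-1}\Bigl(1 - \sum_k p_k^{q}\Bigr) \quad \text{(Tsallis entropy, non-logarithmic)}.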
On some entropy functionals derived from Rényi information divergence
We consider the maximum entropy problems associated with Rényi Q-entropy, subject to two kinds of constraints on expected values. The constraints considered are a constraint on the standard expectation, and a constraint on the generalized expectation as encountered in nonextensive statistics. The optimum maximum entropy probability distributions, which can exhibit a power-law behaviour, are der...
Statistical Inference for Quadratic Rényi Entropy
Entropy and its various generalizations are widely used in mathematical statistics, communication theory, physical and computer sciences for characterizing the amount of information in a probability distribution. We consider estimators of the quadratic Rényi entropy and some related characteristics of discrete and continuous probability distributions based on the number of coincident ...
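The identity underlying the coincidence-based approach in the quadratic, discrete case is the following (a worked equation; the paper's estimators may carry further corrections):

q = P(X = X') = \sum_k p_k^2, \qquad H_2 = -\log q,
\hat q_n = \binom{n}{2}^{-1} \sum_{1 \le i < j \le n} \mathbf{1}\{X_i = X_j\}, \qquad \hat H_2 = -\log \hat q_n,

so the fraction of coincident pairs in a sample is an unbiased U-statistic estimator of q, and its plug-in logarithm estimates the quadratic Rényi entropy.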
On the robustness of q-expectation values and Rényi entropy
We study the robustness of functionals of probability distributions such as the Rényi and nonadditive Sq entropies, as well as the q-expectation values, under small variations of the distributions. We focus on three important types of distribution functions, namely (i) continuous bounded, (ii) discrete with a finite number of states, and (iii) discrete with an infinite number of states. The physical c...